Bibliographic Record - Detail View
Authors | Hilton, Lara G.; Azzam, Tarek |
---|---|
Titel | Crowdsourcing Qualitative Thematic Analysis |
Source | In: American Journal of Evaluation, 40 (2019) 4, pp. 575-589 (15 pages) |
Full-text PDF |
Additional Information | ORCID (Hilton, Lara G.) |
Language | English |
Document Type | print; online; journal article |
ISSN | 1098-2140 |
DOI | 10.1177/1098214019836674 |
Keywords | Qualitative Research; Stakeholders; Data Collection; Chronic Illness; Pain; Expertise; Patients; Coding; Feedback (Response); Evaluation Methods |
Abstract | Evaluations that include stakeholders aim to understand their perspectives and to ensure that their views are represented. This article offers a new approach to gaining stakeholder perspectives through crowdsourcing. We recruited a sample of individuals with chronic low back pain through a crowdsourcing site. This sample coded textual data describing pain, provided themes, and gave feedback on constructs and procedures. The results generated by the crowdsourced participants were compared with results generated by experts. We found that crowdsourcing the coding of textual responses and the development of codes was feasible, rapid, and inexpensive, offering the potential to enhance patient stakeholder engagement in evaluation. Crowdsourcing has broad implications for evaluation science beyond the health sector. (As Provided). |
Notes | SAGE Publications. 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 800-818-7243; Tel: 805-499-9774; Fax: 800-583-2665; e-mail: journals@sagepub.com; Web site: http://sagepub.com |
Indexed by | ERIC (Education Resources Information Center), Washington, DC |
Updated | 2020/01/01 |